Study of fault-tolerant software technology
Presented is an overview of the current state of the art of fault-tolerant software and an analysis of the quantitative techniques and models developed to assess its impact. It examines research efforts as well as experience gained from commercial application of these techniques. The paper also addresses the implications of using fault-tolerant software in real-time aerospace applications for computer architecture and design, spanning hardware, operating systems, and programming languages (including Ada). It concludes that fault-tolerant software has progressed beyond the pure research stage. The paper also finds that, although not perfectly matched, newer architectural and language capabilities provide many of the notations and functions needed to implement software fault tolerance effectively and efficiently.
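To make the idea of software fault tolerance concrete, the sketch below shows a recovery-block scheme, one of the classic fault-tolerant software techniques surveys of this kind cover. It is a minimal illustration in Python rather than Ada; the variants, acceptance test, and function names are hypothetical examples and are not taken from the paper.

```python
import math

def recovery_block(x, primary, alternates, acceptance_test):
    """Run the primary variant; if it raises or its result fails the
    acceptance test, fall back to independently written alternates."""
    for variant in (primary, *alternates):
        try:
            result = variant(x)
        except Exception:
            continue  # a crash counts as a failed variant
        if acceptance_test(x, result):
            return result
    raise RuntimeError("all variants failed the acceptance test")

# Hypothetical example: two independently coded square-root variants.
def primary_sqrt(x):
    return x ** 0.5

def backup_sqrt(x):
    return math.sqrt(x)

def sqrt_ok(x, result):
    return abs(result * result - x) < 1e-9

print(recovery_block(2.0, primary_sqrt, [backup_sqrt], sqrt_ok))
```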
An Evaluation of the Performance of the Twentieth Century Reanalysis Version 3
The performance of a new historical reanalysis, the NOAA–CIRES–DOE Twentieth Century Reanalysis version 3 (20CRv3), is evaluated via comparisons with other reanalyses and independent observations. This dataset provides global, 3-hourly estimates of the atmosphere from 1806 to 2015 by assimilating only surface pressure observations and prescribing sea surface temperature, sea ice concentration, and radiative forcings. Comparisons with independent observations, other reanalyses, and satellite products suggest that 20CRv3 can reliably produce atmospheric estimates on scales ranging from weather events to long-term climatic trends. Not only does 20CRv3 recreate a “best estimate” of the weather, including extreme events, but it also provides an estimate of its confidence through the use of an ensemble. Surface pressure statistics suggest that these confidence estimates are reliable. Comparisons with independent upper-air observations in the Northern Hemisphere demonstrate that 20CRv3 has skill throughout the twentieth century. Upper-air fields from 20CRv3 in the late twentieth century and early twenty-first century correlate well with full-input reanalyses, and the correlation is predicted by the confidence fields from 20CRv3. The skill of analyzed 500-hPa geopotential heights from 20CRv3 for 1979–2015 is comparable to that of modern operational 3–4-day forecasts. Finally, 20CRv3 performs well on climate time scales. Long time series and multidecadal averages of mass, circulation, and precipitation fields agree well with modern reanalyses and station- and satellite-based products. 20CRv3 is also able to capture trends in tropospheric-layer temperatures that correlate well with independent products in the twentieth century, placing recent trends in a longer historical context.
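As an illustration of how an ensemble yields both a best estimate and a confidence measure, the sketch below computes the ensemble mean and the ensemble spread (standard deviation across members) of a field. It is a generic example with synthetic data and hypothetical array shapes, not code from the 20CRv3 system.

```python
import numpy as np

# Hypothetical 80-member ensemble of surface-pressure fields (hPa)
# on a 181 x 360 latitude-longitude grid, filled here with synthetic data.
rng = np.random.default_rng(0)
ensemble = rng.normal(loc=1013.0, scale=2.0, size=(80, 181, 360))

# The ensemble mean serves as the "best estimate" of the field.
best_estimate = ensemble.mean(axis=0)

# The ensemble spread serves as the confidence estimate:
# small spread indicates high confidence in the analysis.
spread = ensemble.std(axis=0)

print(best_estimate.shape, float(spread.mean()))
```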
Scientific challenges of convective-scale numerical weather prediction
Numerical weather prediction (NWP) models are increasing in resolution and becoming capable of explicitly representing individual convective storms. Is this increase in resolution leading to better forecasts? Unfortunately, we do not yet have sufficient theoretical understanding of this weather regime to make full use of these models.
After extensive efforts over the course of a decade, convective-scale weather forecasts with horizontal grid spacings of 1–5 km are now operational at national weather services around the world, accompanied by ensemble prediction systems (EPSs). However, though already operational, the capacity of forecasts at this scale has yet to be fully exploited, because a fundamental difficulty remains: the fully three-dimensional and turbulent nature of the atmosphere. Prediction at this scale is totally different from that at the synoptic scale (10³ km), with its slowly evolving semi-geostrophic dynamics and relatively long predictability on the order of a few days.
Even theoretically, very little is understood about the convective scale compared with our extensive knowledge of the synoptic-scale weather regime as a partial differential equation system, as well as in terms of its fluid mechanics, predictability, uncertainties, and stochasticity. Furthermore, data assimilation methodologies, physics parameterizations (e.g., microphysics), and numerics all require drastic modification for use at the convective scale. We need to focus on more fundamental theoretical issues: the Liouville principle and Bayesian probability for probabilistic forecasts, and more fundamental turbulence research to provide robust numerics for the full variety of turbulent flows.
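For concreteness, the two probabilistic ingredients mentioned above can be written in their standard textbook forms (a generic sketch, not equations taken from the essay): the Liouville equation governs how the forecast probability density of the model state evolves under the dynamics, and Bayes' rule updates that density when observations arrive.
\[
\frac{\partial p(\mathbf{x},t)}{\partial t} + \nabla_{\mathbf{x}} \cdot \big( p(\mathbf{x},t)\, \mathbf{f}(\mathbf{x}) \big) = 0,
\qquad
p(\mathbf{x} \mid \mathbf{y}) = \frac{p(\mathbf{y} \mid \mathbf{x})\, p(\mathbf{x})}{p(\mathbf{y})},
\]
where \(\mathbf{x}\) is the model state evolving as \(\dot{\mathbf{x}} = \mathbf{f}(\mathbf{x})\), \(p(\mathbf{x},t)\) is its forecast probability density, and \(\mathbf{y}\) denotes the observations.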
The present essay reviews these basic theoretical challenges as comprehensively as possible. The breadth of the problems we face is a challenge in itself: an attempt to reduce them to a single critical agenda should be avoided.
Business ethics competencies research: implications for Canadian practitioners
This paper describes a proposed framework of the knowledge, skills, abilities, and other characteristics (KSAOs) that a practitioner who is competent in business ethics, compliance, or integrity should possess. These competencies may be leveraged as key input when selecting content for an institutionalized business ethics training program. The focus of the paper is the management problem of which competencies are important for the job performance of business ethics practitioners. Phase I consisted of developing a provisional taxonomy of business ethics competencies; Phase II involved academic and industry practitioners working in business ethics, who validated the conceptually developed provisional taxonomy in order to support recommendations on the selection of business ethics training content. The contribution to competency-based management knowledge presented in this paper is a proposed business ethics competency model, and the implications of this model for Canadian practitioners are discussed.